# Deep Reasoning Optimization

| Model | License | Description | Tags | Publisher | Downloads | Likes |
| --- | --- | --- | --- | --- | --- | --- |
| DeepSeek R1 0528 | MIT | DeepSeek R1 has received a minor version upgrade; the current release is DeepSeek-R1-0528. The update significantly strengthens the model's deep reasoning and inference capabilities through greater compute investment and algorithmic optimizations during the post-training phase. | Large Language Model, Transformers | deepseek-ai | 4,556 | 1,249 |
| GLM Z1 9B 0414 GGUF | MIT | GLM-Z1-9B-0414 is a 9B-parameter open-source model from the GLM family, specializing in mathematical reasoning and general tasks, and well suited to resource-constrained scenarios. | Large Language Model, Supports Multiple Languages | unsloth | 2,258 | 5 |
| THUDM GLM 4 32B 0414 6.5bpw H8 Exl2 | MIT | GLM-4-32B-0414 is a new member of the GLM family with 32 billion parameters, comparable in performance to the GPT series, and supports local deployment. | Large Language Model, Transformers, Supports Multiple Languages | LatentWanderer | 148 | 2 |
| GLM 4 32B 0414 Bnb 4bit | MIT | GLM-4-32B-0414 is a new member of the GLM family with 32 billion parameters, comparable in performance to the GPT and DeepSeek-V3 series, and supports local deployment. | Large Language Model, Transformers, Supports Multiple Languages | unsloth | 41 | 2 |
| GLM 4 32B 0414 Unsloth Bnb 4bit | MIT | GLM-4-32B-0414 is a new member of the GLM family with 32 billion parameters, comparable in performance to the GPT and DeepSeek series, and supports local deployment. | Large Language Model, Transformers, Supports Multiple Languages | unsloth | 87 | 2 |
| GLM 4 32B 0414 GGUF | MIT | GLM-4-32B-0414 is a 32-billion-parameter large language model comparable in performance to GPT-4o and DeepSeek-V3. It supports both Chinese and English and excels at code generation, function calling, and complex task processing. | Large Language Model, Supports Multiple Languages | unsloth | 4,680 | 10 |
| GLM 4 32B 0414 | MIT | GLM-4-32B-0414 is a new member of the GLM family with 32 billion parameters, offering performance comparable to GPT-4o and DeepSeek-V3, and supports local deployment. | Large Language Model, Transformers, Supports Multiple Languages | unsloth | 101 | 3 |
| GLM 4 32B 0414 | MIT | GLM-4-32B-0414 is a 32-billion-parameter large language model comparable in performance to the GPT series. It supports both Chinese and English and excels at code generation, function calling, and complex task processing (see the loading sketch below the table). | Large Language Model, Transformers, Supports Multiple Languages | THUDM | 10.91k | 320 |
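
Several of the builds above (the GGUF, Exl2, and bnb-4bit variants, and the base THUDM release) are positioned for local deployment. The following is a minimal sketch of what that looks like with the Transformers library; the Hugging Face repository id `THUDM/GLM-4-32B-0414` is an assumption inferred from the listing, not confirmed by it, and running the full 32B weights requires correspondingly large GPU/CPU memory.

```python
# Minimal local-deployment sketch with Hugging Face Transformers.
# The repository id below is an assumption inferred from the listing
# (publisher THUDM, model GLM 4 32B 0414); adjust it to the build you use.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "THUDM/GLM-4-32B-0414"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",   # use the precision stored in the checkpoint
    device_map="auto",    # shard the 32B weights across available devices
)

# A simple chat-style prompt exercising the code-generation capability
# highlighted in the listing.
messages = [
    {"role": "user", "content": "Write a Python function that returns the n-th Fibonacci number."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The same pattern applies to the DeepSeek R1 0528 entry by swapping the repository id; the quantized GGUF builds would instead be loaded through a llama.cpp-based runtime rather than Transformers.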